
    Sampling based progressive hedging algorithms for stochastic programming problems

    Many real-world optimization problems involve parameter uncertainty. For instances where the uncertainties can be estimated to a certain degree, stochastic programming (SP) methodologies are used to identify robust plans. Despite advances in SP, solving real-world stochastic programming problems remains a challenge, in part due to the exponentially increasing number of scenarios. For two-stage and multi-stage problems, the number of scenarios increases exponentially with the number of uncertain parameters, and for multi-stage problems also with the number of decision stages. For large-scale mixed-integer stochastic problem instances, there are two common approaches: approximation methods and decomposition methods. The most common sampling-based approximation (SAA) technique in SP is the Monte Carlo sampling-based method. The Progressive Hedging Algorithm (PHA), on the other hand, can solve large problems to optimality by decomposing them into smaller problem instances. SAA, while used effectively in many applications, can lead to poor solution quality if the selected sample sizes are not sufficiently large; with larger sample sizes and multi-stage SPs, however, the method becomes impractical due to the significant computational effort required. In contrast, PHA suffers from the need to solve many sub-problems iteratively, which is computationally expensive. In this dissertation, we develop novel SP algorithms that integrate the sampling-based SAA and decomposition-based PHA methods. The proposed integrated methods are novel in that they marry the complementary aspects of PHA and SAA in terms of exactness and computational efficiency. Further, the developed methods are practical in that they allow the analyst to calibrate the trade-off between the exactness and the speed of attaining a solution. We demonstrate the effectiveness of the developed integrated approaches, the Sampling Based Progressive Hedging Algorithm (SBPHA) and Discarding SBPHA (d-SBPHA), over the pure strategies (i.e., SAA or PHA) as well as other commonly used SP methods through extensive experimentation. In addition, we develop alternative hybridization strategies and present results of extensive experiments for these strategies under different uncertainty models. The methods are validated on the Capacitated Reliable Facility Location Problem (CRFLP) and multi-stage stochastic lot-sizing problems.
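
    As a rough illustration of how sampling and scenario decomposition fit together, the sketch below runs a progressive-hedging loop over a Monte Carlo scenario sample for a toy two-stage newsvendor problem. The model, sample size, and all parameter values are invented for illustration and are not taken from the dissertation; SBPHA and d-SBPHA add further hybridization and discarding strategies on top of such a basic loop.

```python
# Illustrative sketch only: progressive hedging on a sampled scenario set
# (SAA-style) for a toy newsvendor problem with invented parameters.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
c, r = 5.0, 9.0                      # unit purchase cost, unit selling price (made up)
demands = rng.normal(100, 20, 30)    # SAA step: Monte Carlo sample of 30 demand scenarios
probs = np.full(len(demands), 1.0 / len(demands))

def scenario_cost(x, d):
    """First-stage purchase cost minus second-stage sales revenue for one scenario."""
    return c * x - r * min(x, d)

rho = 1.0                            # PHA penalty parameter
w = np.zeros(len(demands))           # PHA dual weights, one per scenario
x_bar = 100.0                        # initial consensus first-stage decision

for _ in range(200):                 # PHA iterations
    # Decomposition step: solve one augmented scenario subproblem at a time.
    x_s = np.array([
        minimize_scalar(
            lambda x, d=d, w_s=w_s: scenario_cost(x, d) + w_s * x
                                    + 0.5 * rho * (x - x_bar) ** 2,
            bounds=(0.0, 200.0), method="bounded"
        ).x
        for d, w_s in zip(demands, w)
    ])
    x_bar = float(probs @ x_s)       # aggregation: implementable first-stage decision
    w += rho * (x_s - x_bar)         # dual weight update
    if np.max(np.abs(x_s - x_bar)) < 1e-4:
        break

print(f"Consensus order quantity on the sampled scenarios: {x_bar:.2f}")
```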

    End-of-life vehicle management: a comprehensive review

    Waste management has gained great importance in recent years. As the automotive industry is one of the most critical and rapidly growing sectors worldwide, the management of end-of-life vehicles (ELVs) becomes more important every day. Due to legislation and new regulations, actors such as users, producers, and treatment facilities are being assigned new responsibilities in the ELV management process. Moreover, ELV management is of vital importance for environmental conservation, the circular economy, and sustainable development. All of these reasons make ELV management a crucial issue to study. Today, ELV management is a well-established and emerging research area. However, the available review papers focus only on narrow areas of ELV management, such as reverse logistics, recovery infrastructure, and disassemblability. In addition, a review of state-of-the-art mathematical models for ELV management is still missing. This paper aims to provide an extensive content-analysis overview of studies on ELV management. A total of 232 studies published in the period 2000-2019 are collected, categorized, reviewed, and analyzed. A critical review of the published literature is provided, and gaps in the literature are identified to clarify and suggest future research directions. This review can serve as a source of references, valuable insights, and opportunities for researchers interested in ELV management and inspire further attention to the topic.

    A stochastic mathematical model to locate field hospitals under disruption uncertainty for large-scale disaster preparedness

    In this study, we consider field hospital location decisions for emergency treatment points in response to large-scale disasters. Specifically, we develop a two-stage stochastic model that determines the number and locations of field hospitals and the allocation of injured victims to these field hospitals. Our model considers the locations as well as the possible failures of the existing public hospitals while deciding where the field hospitals should be opened. The model is a variant of the P-median location model and integrates both capacity restrictions on the field hospitals to be opened and the disruptions that may occur in existing public hospitals. We conducted experiments to demonstrate how the proposed model can be used in practice on a real-life case. The results show the effects of existing-hospital failures, the level of failure probability, and the capacity of the planned field hospitals on the performance of a given emergency treatment system. Crucially, the model also assesses the average distance over which a victim must be transferred in order to be treated properly, and from this assessment the proportion of total satisfied demand is calculated.
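
    A minimal sketch of this kind of two-stage stochastic, capacitated P-median-style formulation is given below, using the open-source PuLP modeller. The sets, distances, capacities, demand figures, and the single disruption scenario are all invented for illustration; the paper's actual model and case data are not reproduced here.

```python
# Illustrative sketch: locate p field hospitals, assign victims under
# scenarios in which an existing hospital may be disrupted.  All data made up.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

demand_pts  = ["d1", "d2", "d3"]
field_sites = ["f1", "f2"]                 # candidate field hospital sites
existing    = ["h1"]                       # existing public hospital
facilities  = field_sites + existing
p           = 1                            # number of field hospitals to open
demand      = {"d1": 25, "d2": 15, "d3": 15}
capacity    = {"f1": 60, "f2": 60, "h1": 50}
dist = {("d1", "f1"): 2, ("d1", "f2"): 6, ("d1", "h1"): 4,
        ("d2", "f1"): 5, ("d2", "f2"): 3, ("d2", "h1"): 2,
        ("d3", "f1"): 7, ("d3", "f2"): 4, ("d3", "h1"): 6}
# Disruption scenarios: does the existing hospital survive the disaster?
scenarios = {"up": {"prob": 0.7, "avail": 1}, "down": {"prob": 0.3, "avail": 0}}

m = LpProblem("field_hospital_location", LpMinimize)
y = {j: LpVariable(f"open_{j}", cat=LpBinary) for j in field_sites}
x = {(i, j, s): LpVariable(f"x_{i}_{j}_{s}", lowBound=0)
     for i in demand_pts for j in facilities for s in scenarios}

# Objective: expected demand-weighted travel distance over the disruption scenarios.
m += lpSum(scenarios[s]["prob"] * dist[i, j] * x[i, j, s]
           for i in demand_pts for j in facilities for s in scenarios)

m += lpSum(y[j] for j in field_sites) == p               # open exactly p field hospitals
for s in scenarios:
    for i in demand_pts:                                 # every victim is assigned somewhere
        m += lpSum(x[i, j, s] for j in facilities) == demand[i]
    for j in field_sites:                                # field hospital capacity if opened
        m += lpSum(x[i, j, s] for i in demand_pts) <= capacity[j] * y[j]
    for j in existing:                                   # existing capacity unless disrupted
        m += lpSum(x[i, j, s] for i in demand_pts) <= capacity[j] * scenarios[s]["avail"]

m.solve(PULP_CBC_CMD(msg=False))
print("opened field hospitals:", [j for j in field_sites if y[j].value() > 0.5])
```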

    A fuzzy-based multi-dimensional and multi-period service quality evaluation outline for rail transit systems

    As a public transportation mode, rail transit systems are among the most preferred options for avoiding traffic congestion, especially during rush hours. This paper proposes a service quality evaluation outline to measure the performance of rail transit lines via passenger satisfaction surveys. The proposed method combines statistical analysis, trapezoidal fuzzy numbers, and TOPSIS to evaluate service quality levels over multiple periods. In total, 17,769 surveys conducted in Istanbul in 2012, 2013, and 2014 are considered to determine the factors that need to be improved. We provide recommendations to enhance operations on specific lines and guidelines for future investments.
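
    To illustrate the evaluation step, the sketch below encodes linguistic satisfaction ratings as trapezoidal fuzzy numbers, defuzzifies them, and ranks the survey periods with a standard TOPSIS procedure. The linguistic scale, criteria, weights, and ratings are invented placeholders, not the paper's survey data or its exact fuzzy-TOPSIS variant.

```python
# Illustrative sketch: trapezoidal fuzzy ratings + crisp TOPSIS ranking of periods.
import numpy as np

# Trapezoidal fuzzy numbers (a, b, c, d) for a 5-point linguistic scale (made up).
scale = {"VP": (0, 0, 1, 2), "P": (1, 2, 3, 4), "F": (3, 4, 5, 6),
         "G": (5, 6, 7, 8), "VG": (7, 8, 9, 9)}

def defuzz(t):
    """Defuzzify a trapezoidal number (a, b, c, d) by its average."""
    return sum(t) / 4.0

# Alternatives = survey periods, criteria = hypothetical service-quality dimensions.
periods = ["2012", "2013", "2014"]
ratings = [["F", "G", "P"],            # e.g. comfort, punctuality, information
           ["G", "G", "F"],
           ["G", "VG", "F"]]
weights = np.array([0.4, 0.35, 0.25])  # criterion weights (all benefit criteria)

X = np.array([[defuzz(scale[r]) for r in row] for row in ratings])
R = X / np.linalg.norm(X, axis=0)      # vector normalization
V = R * weights                        # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti,  axis=1)
closeness = d_minus / (d_plus + d_minus)

for period, c in sorted(zip(periods, closeness), key=lambda t: -t[1]):
    print(f"{period}: closeness coefficient = {c:.3f}")
```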

    Analyzing the efficiency of bank branches

    In the 21st century, efficiency and productivity have become concepts discussed in both the business and academic worlds more frequently than in the past. It is known that increasing the efficiency and productivity of both production and service systems is difficult. In this study, an efficiency analysis of the branches of a bank was conducted. Furthermore, a Weighted Stochastic Imprecise Data Envelopment Analysis (WSIDEA), a new approach based on Data Envelopment Analysis (DEA), was proposed. The efficiency levels and results of the decision-making units were examined with the proposed method. Additionally, results for six different DEA models were obtained. The results of the six DEA models and the proposed WSIDEA model were compared in terms of the efficiency levels of the decision-making units, and the differences between them were examined. The sensitivity of the inefficient units was also examined, and unrealistic efficiency levels assigned to branches by traditional methods were analyzed. Beyond these sensitivity analyses, the sensitivity of the data set used in the analysis was also scrutinized.
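
    As background for the proposed WSIDEA, the sketch below computes standard input-oriented CCR DEA efficiency scores for a handful of hypothetical branches by solving one linear program per decision-making unit. The branch data, inputs, and outputs are made up, and the weighted, stochastic, and imprecise extensions developed in the paper are not reproduced.

```python
# Illustrative sketch: input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

# Rows = branches (DMUs); inputs = (staff, operating cost), outputs = (loans, deposits).
inputs  = np.array([[5.0, 120.0], [8.0, 150.0], [6.0, 100.0], [9.0, 200.0]])
outputs = np.array([[300.0, 500.0], [400.0, 450.0], [350.0, 600.0], [380.0, 520.0]])
n, m, s = inputs.shape[0], inputs.shape[1], outputs.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o: minimize theta over (theta, lambdas)."""
    c = np.r_[1.0, np.zeros(n)]                          # minimize theta
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-inputs[o].reshape(-1, 1), inputs.T])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -outputs.T])
    b_out = -outputs[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(n):
    print(f"branch {o + 1}: CCR efficiency = {ccr_efficiency(o):.3f}")
```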

    A stochastic location and allocation model for critical items to response large-scale emergencies: A case of Turkey

    This paper aims to determine the number and locations of facilities, pre- and post-disaster procurement, and allocation decisions in order to mitigate the effects of large-scale emergencies. A two-stage stochastic mixed-integer programming model is proposed that combines facility location and prepositioning, decisions on pre-stocking levels for emergency supplies, the allocation of the located distribution centers (DCs) to affected areas, and the distribution of those supplies to demand locations after large-scale emergencies under demand uncertainty. The use of the model is demonstrated through a case study on the prepositioning of supplies for probable large-scale emergencies in the eastern and southeastern Anatolia regions of Turkey. The results provide a framework for relief organizations to determine the location and number of DCs in different settings by using the proposed model with its main parameters: facility capacity, the probability of each demand point being affected, the severity of events, and the maximum distance between a demand point and a distribution center.
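
    A rough sketch of a two-stage stochastic prepositioning model in this spirit is shown below: the first stage opens DCs and sets pre-stock levels, and the second stage ships stock and procures additional units after the event at a higher cost. All costs, capacities, and demand scenarios are invented, and features such as event severity, affectedness probabilities, and distance limits from the paper's model are omitted.

```python
# Illustrative sketch: open DCs, pre-stock supplies, and serve scenario demand,
# with more expensive post-disaster procurement as recourse.  All data made up.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

dcs, demand_pts = ["dc1", "dc2"], ["a", "b"]
open_cost = {"dc1": 100, "dc2": 80}
cap       = {"dc1": 120, "dc2": 90}
pre_cost, post_cost = 1.0, 3.0                    # pre- vs post-disaster unit procurement
ship = {("dc1", "a"): 2, ("dc1", "b"): 5, ("dc2", "a"): 4, ("dc2", "b"): 1}
scen = {"mild":   {"prob": 0.6, "demand": {"a": 40, "b": 30}},
        "severe": {"prob": 0.4, "demand": {"a": 90, "b": 70}}}

m = LpProblem("prepositioning", LpMinimize)
y = {j: LpVariable(f"open_{j}", cat=LpBinary) for j in dcs}
q = {j: LpVariable(f"stock_{j}", lowBound=0) for j in dcs}            # first-stage pre-stock
x = {(j, i, s): LpVariable(f"ship_{j}_{i}_{s}", lowBound=0)
     for j in dcs for i in demand_pts for s in scen}
e = {(i, s): LpVariable(f"extra_{i}_{s}", lowBound=0)                 # post-disaster procurement
     for i in demand_pts for s in scen}

# Objective: first-stage opening and stocking cost plus expected recourse cost.
m += (lpSum(open_cost[j] * y[j] + pre_cost * q[j] for j in dcs)
      + lpSum(scen[s]["prob"] * ship[j, i] * x[j, i, s]
              for j in dcs for i in demand_pts for s in scen)
      + lpSum(scen[s]["prob"] * post_cost * e[i, s] for i in demand_pts for s in scen))

for j in dcs:
    m += q[j] <= cap[j] * y[j]                        # can only stock an opened DC
for s in scen:
    for j in dcs:
        m += lpSum(x[j, i, s] for i in demand_pts) <= q[j]            # ship at most the pre-stock
    for i in demand_pts:
        m += lpSum(x[j, i, s] for j in dcs) + e[i, s] >= scen[s]["demand"][i]

m.solve(PULP_CBC_CMD(msg=False))
print({j: (int(y[j].value()), round(q[j].value(), 1)) for j in dcs})
```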

    A multiattribute customer satisfaction evaluation approach for rail transit network: A real case study for Istanbul, Turkey

    Rail transit is one of the most important public transportation modes, especially in big and crowded cities. Therefore, achieving a high customer satisfaction level is an essential task for municipalities and governments. For this purpose, a survey is conducted on the attributes of the rail transit network (metros, trams, light rail, and funicular) in Istanbul. In this study, we present a novel framework that integrates statistical analysis, SERVQUAL, interval type-2 fuzzy sets, and VIKOR to evaluate the customer satisfaction level for the rail transit network of Istanbul. The level of crowdedness and density in the trains, the air-conditioning of the trains' interiors, the noise level and vibration during the journey, and phone services are determined to be the attributes that need improvement. In addition, different improvement strategies are suggested for the rail transit network. The proposed approach provides directions for future investments and can be generalized and applied to complex decision-making problems that involve inexact, indefinite, and subjective data or uncertain information.
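
    The ranking step of such a framework can be illustrated with a minimal crisp VIKOR computation, shown below. The paper works with SERVQUAL gaps and interval type-2 fuzzy sets before this step; here the attribute scores, weights, and alternative lines are invented, and only the attribute names from the abstract are reused.

```python
# Illustrative sketch: crisp VIKOR ranking of hypothetical rail lines on the
# four attributes named in the abstract.  Scores and weights are made up.
import numpy as np

attributes = ["crowdedness", "air_conditioning", "noise_vibration", "phone_service"]
lines = ["M1", "M2", "T1"]                 # hypothetical alternative lines
# Higher = better satisfaction score per attribute (invented values).
F = np.array([[6.2, 5.1, 5.8, 4.9],
              [7.0, 6.4, 6.1, 5.5],
              [5.9, 5.6, 6.5, 6.0]])
w = np.array([0.3, 0.25, 0.25, 0.2])       # attribute weights
v = 0.5                                    # weight of the "group utility" strategy

f_best, f_worst = F.max(axis=0), F.min(axis=0)
norm_gap = (f_best - F) / (f_best - f_worst)      # regret of each line on each attribute
S = (w * norm_gap).sum(axis=1)                    # group utility
R = (w * norm_gap).max(axis=1)                    # individual regret
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

for line, q in sorted(zip(lines, Q), key=lambda t: t[1]):
    print(f"{line}: VIKOR Q = {q:.3f}")           # lower Q = better compromise ranking
```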